
    Addiction to car use and dynamic elasticity measures in France

    This article presents a microeconometric analysis of the annual mileage travelled by French households with their personal cars, defining their automobility. To capture car-use dependence, the rational addiction model of Becker et al. (1994) is applied to a panel dataset drawn from the French "Car Fleet" survey over the period 1999-2001. Importantly, the estimates show that the assumption of addiction to car use cannot be rejected. Furthermore, the model yields realistic kilometric-price and income elasticities of household automobility, for both the short and the long run.

    Keywords: Transportation; Car use; Consumption; Addiction; Panel; GMM
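    The rational addiction framework of Becker, Grossman and Murphy (1994) applied here leads, schematically, to a demand equation in which current use depends on both past and future use; the exact covariates of the paper are not reproduced, so the symbols below are illustrative:

    ```latex
    % Annual mileage C of household i in year t (sketch):
    % addiction links current use to lagged and future use,
    % with kilometric price p and income y as regressors.
    C_{i,t} = \theta\, C_{i,t-1} + \beta\theta\, C_{i,t+1}
            + \gamma_1\, p_{i,t} + \gamma_2\, y_{i,t} + \varepsilon_{i,t}
    ```

    Addiction corresponds to \(\theta > 0\); because the lead term \(C_{i,t+1}\) is endogenous, such equations are typically estimated by GMM on panel data, consistent with the keywords above. Short-run elasticities hold the rest of the consumption path fixed, while long-run elasticities let the whole path adjust, hence the two sets of values reported.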

    Next-Generation Model-based Variability Management: Languages and Tools

    Variability modelling and management is a key activity in a growing number of software engineering contexts, from software product lines to dynamic adaptive systems. Feature models are the de facto standard for formally representing and reasoning about the commonality and variability of a software system. This tutorial presents the next generation of feature modelling languages and tools, directly applicable to a wide range of model-based variability problems and application domains. Participants (whether practitioners or academics, beginners or advanced) will learn the principles and foundations of languages and tool-supported techniques dedicated to the model-based management of variability.

    Utilization of temperature kinetics as a method to predict treatment intensity and corresponding treated wood quality : durability and mechanical properties of thermally modified wood

    Wood heat treatment is an attractive alternative for improving the decay resistance of wood species with low natural durability. However, this improvement in durability comes at the expense of mechanical resistance. Decay resistance and mechanical properties are strongly correlated with the thermal degradation of wood cell wall components. The mass loss resulting from this degradation is a good indicator of treatment intensity and of the final properties of the treated wood. However, introducing a fast and accurate system for measuring this mass loss on an industrial scale is very difficult. Many current studies therefore focus on determining control parameters that can be correlated with the treatment conditions and the final quality of heat-treated wood, such as decay resistance. The aim of this study is to investigate the relations between the temperature kinetics used during the thermal treatment process (representing heat treatment intensity), the mass losses due to thermal degradation, and the properties conferred on heat-treated wood. The relative area of the treatment temperature curves appears to be a good indicator of treatment intensity. Heat treatments with different treatment conditions (temperature-time) were performed under vacuum on four wood species (one hardwood and three softwoods) in order to obtain thermal degradation mass losses of 8, 10 and 12%. For each experiment, the relative areas corresponding to the temperature kinetics, mass loss, decay resistance and mechanical properties were determined. The results support the statement that the area of the temperature curves is a good indicator for predicting the treatment intensity needed to obtain the required wood durability and mechanical properties, such as bending resistance and Brinell hardness.

    LERMaB is supported by the French National Research Agency through the Laboratory of Excellence ARBRE (ANR-12-LABXARBRE-01); the authors gratefully acknowledge this aid.
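    A "relative area of the temperature curve" indicator of the kind described can be sketched as the time integral of temperature above a reference threshold, computed with the trapezoidal rule. The function name, threshold and units below are hypothetical and only illustrate the idea, not the paper's exact definition:

    ```python
    def relative_area(times_min, temps_c, t_ref_c=150.0):
        """Area between the kiln temperature curve and a reference
        temperature (trapezoidal rule), in degC * min. Larger areas
        indicate a more intense heat treatment."""
        area = 0.0
        for (t0, temp0), (t1, temp1) in zip(
                zip(times_min, temps_c),
                zip(times_min[1:], temps_c[1:])):
            h0 = max(temp0 - t_ref_c, 0.0)  # only count time above the reference
            h1 = max(temp1 - t_ref_c, 0.0)
            area += 0.5 * (h0 + h1) * (t1 - t0)
        return area
    ```

    An indicator of this form can then be regressed against measured mass loss, decay resistance or hardness to calibrate the predictive relation the study investigates.
    
    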

    Composing Feature Models

    Feature modeling is a widely used technique in Software Product Line development. Feature models allow stakeholders to describe domain concepts in terms of commonalities and differences within a family of software systems. Developing a complex monolithic feature model can require significant effort and restrict the reusability of a set of features already modeled. We advocate using modeling techniques that support separating and composing concerns to better manage the complexity of developing large feature models. In this paper, we propose a set of composition operators dedicated to feature models. These composition operators enable the development of large feature models by composing smaller feature models that address well-defined concerns. The operators are notably distinguished by their documented capabilities to preserve some significant properties.
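    As a toy illustration of composition semantics (not the operators defined in the paper), a feature model can be abstracted as the set of configurations it admits; a union merge then accepts any configuration valid in either input model, and an intersection merge only those valid in both. All names below are invented for the example:

    ```python
    # Each feature model is abstracted as the set of its valid configurations;
    # a configuration is a frozenset of selected feature names.
    def merge_union(fm1, fm2):
        """Merged model accepts products valid in either input model."""
        return fm1 | fm2

    def merge_intersection(fm1, fm2):
        """Merged model accepts only products valid in both input models."""
        return fm1 & fm2

    fm_a = {frozenset({"gui"}), frozenset({"gui", "dark_mode"})}
    fm_b = {frozenset({"gui"}), frozenset({"gui", "plugins"})}
    ```

    Real merge operators work on the model syntax (feature trees plus cross-tree constraints) rather than enumerating configurations; the set view above only states the semantic contract such operators are expected to preserve.
    
    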

    Imaging Services on the Grid as a Product Line: Requirements and Architecture

    SOA is now the reference architecture for medical image processing on the grid. Imaging services must be composed into workflows to implement the processing chains, but the need to handle end-to-end qualities of service hampers both the provision of services and their composition. This paper analyses the variability of functional and non-functional aspects of this domain and proposes a first architecture in which services are organized within a product line architecture and metamodels help structure the necessary information.

    Separation of Concerns in Feature Modeling: Support and Applications

    Feature models (FMs) are a popular formalism for describing the commonality and variability of software product lines (SPLs) in terms of features. SPL development increasingly involves manipulating many large FMs, and thus scalable modular techniques that support compositional development of complex SPLs are required. In this paper, we describe how a set of complementary operators (aggregate, merge, slice) provides practical support for separation of concerns in feature modeling. We show how the combination of these operators can assist in tedious and error-prone tasks such as automated correction of FM anomalies, update and extraction of FM views, reconciliation of FMs, and reasoning about properties of FMs. For each task, we report on practical applications in different domains. We also present a technique that can efficiently decompose FMs with thousands of features, and report our experimental results.

    Numerical simulation and experimental validation of gap supported tube subjected to fluid-elastic coupling forces for hybrid characterization tests

    In steam generators, the primary loop tubes are subjected to fluid coupling forces and impacts. Understanding the behavior of these tubes is crucial when designing steam generators: it enables optimization of the energy produced and a long service life of the structure. Up to now, the effect of the coupling forces on structural behavior has been identified on reduced-scale structures. The aim of our research is therefore to give a better understanding of the stabilizing effects of shocks and of the coupling with fluid-elastic forces. Since fluid-elastic forces are difficult to simulate numerically and expensive to reproduce experimentally, the fluid coupling forces are assumed to be represented by velocity-dependent (fluid and structure) damping and stiffness matrices, and are reproduced experimentally using active vibration control in hybrid experimental tests, simplifying the characterization of large structures. In this paper, a method for modeling the structural behavior in order to estimate the effects of the coupling between the fluid-elastic forces and impacts is presented. This strategy implies lower costs and avoids the difficulties associated with handling fluid in the experiments. The model will be implemented in the active control loop in the next step of the study.
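    The modeling assumption described above (fluid-elastic forces represented by velocity-dependent damping and stiffness matrices) amounts, schematically, to a tube equation of motion of the following form; the notation is illustrative rather than the paper's:

    ```latex
    M\,\ddot{x}(t) + \bigl[C_s + C_f(V)\bigr]\,\dot{x}(t)
                   + \bigl[K_s + K_f(V)\bigr]\,x(t) = F_{\mathrm{impact}}(t)
    ```

    Here \(M\), \(C_s\), \(K_s\) are the structural mass, damping and stiffness matrices, \(C_f(V)\) and \(K_f(V)\) the fluid-elastic contributions depending on the flow velocity \(V\), and \(F_{\mathrm{impact}}\) the forces at the gap supports. In a hybrid test, the \(C_f\) and \(K_f\) terms are injected by the active control loop in place of a real flow.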

    Composing Multiple Variability Artifacts to Assemble Coherent Workflows

    The development of scientific workflows is evolving towards the systematic use of service-oriented architectures, enabling the composition of dedicated and highly parameterized software services into processing pipelines. Building consistent workflows then becomes a cumbersome and error-prone activity, as users cannot manage such large-scale variability. This paper presents a rigorous and tooled approach in which techniques from Software Product Line (SPL) engineering are reused and extended to manage variability in service and workflow descriptions. Composition can be facilitated while ensuring consistency. Services are organized in a rich catalog, structured as an SPL according to the common and variable concerns captured for all services. By relying on sound merging techniques on the feature models that make up the catalog, reasoning about the compatibility between connected services is made possible. Moreover, an entire workflow is then seen as a multiple SPL (i.e., a composition of several SPLs). When services are configured within it, the propagation of variability choices is automated with appropriate techniques and the user is assisted in obtaining a consistent workflow. The proposed approach is completely supported by a combination of dedicated tools and languages. Illustrations and experimental validations are provided using medical imaging pipelines, which are representative of current scientific workflows in many domains.

    Opening the Software Engineering Toolbox for the Assessment of Trustworthy AI

    Trustworthiness is a central requirement for the acceptance and success of human-centered artificial intelligence (AI). To deem an AI system trustworthy, it is crucial to assess its behaviour and characteristics against a gold standard of Trustworthy AI, consisting of guidelines, requirements, or mere expectations. While AI systems are highly complex, their implementations are still based on software. The software engineering community has a long-established toolbox for the assessment of software systems, especially in the context of software testing. In this paper, we argue for the application of software engineering and testing practices to the assessment of trustworthy AI. We make the connection between the seven key requirements defined by the European Commission's AI high-level expert group and established procedures from software engineering, and raise questions for future work.